
    Z Distance Function for KNN Classification

    This paper proposes a new distance metric, called the Z distance, for KNN classification. The Z distance is not the direct geometric (straight-line) distance between two data points; it also takes the class attribute of the training dataset into account when measuring the affinity between data points. Concretely, the Z distance between two data points combines their class-center distance with their real distance, and its path is shaped like a "Z". In this way, the affinity between two data points in the same class is always stronger than that between points in different classes; equivalently, intraclass data points are always closer than interclass data points. We evaluated the Z distance experimentally and demonstrated that the proposed distance function achieves better performance in KNN classification.
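The abstract does not spell out the exact formula, but its description suggests a sketch along these lines: blend the distance between the points' class centers with the direct point-to-point distance, so that two same-class points (which share a center) always come out closer than a cross-class pair at the same real distance. The function name and the blending weight `alpha` are assumptions, not the paper's definition.

```python
import numpy as np

def z_distance(x, y, center_x, center_y, alpha=0.5):
    """Illustrative Z-style distance: a blend of the class-center distance
    and the direct point-to-point distance (the weighting is an assumption).

    For two points in the same class, center_x == center_y, so the
    class-center term vanishes and the pair is pulled closer together.
    """
    center_dist = np.linalg.norm(center_x - center_y)  # distance between class centers
    real_dist = np.linalg.norm(x - y)                  # ordinary straight-line distance
    return alpha * center_dist + (1 - alpha) * real_dist
```

In a KNN setting, the class centers would be the per-class means of the training set; the sketch makes the intraclass-closer-than-interclass property immediate, since the center term is zero for same-class pairs.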

    Benign Overfitting in Classification: Provably Counter Label Noise with Larger Models

    Studies on benign overfitting provide insights into the success of overparameterized deep learning models. In this work, we examine whether overfitting is truly benign in real-world classification tasks. We start with the observation that a ResNet model overfits benignly on CIFAR-10 but not on ImageNet. To understand why benign overfitting fails in the ImageNet experiment, we theoretically analyze benign overfitting under a more restrictive setup in which the number of parameters is not significantly larger than the number of data points. Under this mild overparameterization setup, our analysis identifies a phase change: unlike in previous heavy overparameterization settings, benign overfitting can now fail in the presence of label noise. Our analysis explains our empirical observations and is validated by a set of control experiments with ResNets. Our work highlights the importance of understanding implicit bias in underfitting regimes as a future direction. Comment: Published as a conference paper at ICLR 202

    FuzzySkyline: QoS-Aware Fuzzy Skyline Parking Recommendation Using Edge Traffic Facilities


    Unsupervised Feature Selection Algorithm via Local Structure Learning and Kernel Function

    To reduce the dimensionality of high-dimensional data, a series of feature selection algorithms have been proposed. However, these algorithms have the following disadvantages: (1) they do not fully consider the nonlinear relationships between data features; (2) they do not consider the similarity between data features. To solve these two problems, we propose an unsupervised feature selection algorithm based on local structure learning and kernel functions. First, through a kernel function, we map each feature of the data into a kernel space, so that the nonlinear relationships among data features can be fully exploited. Second, we apply local structure learning to the features of the data, so that the similarity between features is taken into account. We then add a low-rank constraint to capture the global information of the data. Finally, we add sparse learning to perform feature selection. The experimental results show that the proposed algorithm outperforms the comparison methods.
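As a concrete point of reference for local-structure-based unsupervised feature selection (a standard baseline in this literature, not the paper's algorithm), the Laplacian score ranks features by how well they respect the local kNN structure of the data:

```python
import numpy as np

def laplacian_scores(X, k=5, sigma=1.0):
    """Laplacian score for unsupervised feature selection: build a kNN
    affinity graph over the samples, then score each feature by how
    smoothly it varies over that graph (lower score = better feature).
    Illustrative baseline only, not the algorithm proposed in the paper.
    """
    n = X.shape[0]
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    # kNN affinity matrix with a heat-kernel weight
    S = np.zeros((n, n))
    for i in range(n):
        nbrs = np.argsort(d2[i])[1:k + 1]  # skip self at index 0
        S[i, nbrs] = np.exp(-d2[i, nbrs] / sigma)
    S = np.maximum(S, S.T)                 # symmetrize
    D = np.diag(S.sum(axis=1))             # degree matrix
    L = D - S                              # graph Laplacian
    ones = np.ones(n)
    scores = []
    for f in X.T:
        # remove the degree-weighted mean so constant features score worst
        f_tilde = f - (f @ D @ ones) / (ones @ D @ ones) * ones
        denom = f_tilde @ D @ f_tilde
        scores.append((f_tilde @ L @ f_tilde) / denom if denom > 1e-12 else np.inf)
    return np.array(scores)
```

A feature that agrees with the cluster structure of the data varies little across graph edges and therefore scores low; pure-noise features score near 1.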

    Predictive Inference with Feature Conformal Prediction

    Conformal prediction is a distribution-free technique for establishing valid prediction intervals. Although conformal prediction is conventionally conducted in the output space, this is not the only possibility. In this paper, we propose feature conformal prediction, which extends the scope of conformal prediction to semantic feature spaces by leveraging the inductive bias of deep representation learning. From a theoretical perspective, we demonstrate that feature conformal prediction provably outperforms regular conformal prediction under mild assumptions. Our approach can be combined not only with vanilla conformal prediction but also with other adaptive conformal prediction methods. Apart from experiments on existing predictive inference benchmarks, we also demonstrate the state-of-the-art performance of the proposed methods on large-scale tasks such as ImageNet classification and Cityscapes image segmentation. The code is available at \url{https://github.com/AlvinWen428/FeatureCP}. Comment: Published as a conference paper at ICLR 202
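For readers unfamiliar with the baseline the paper extends, vanilla split conformal prediction in the output space fits in a few lines: compute absolute residuals on a held-out calibration set, take a finite-sample-corrected quantile, and widen every point prediction by that amount.

```python
import numpy as np

def split_conformal_interval(model, X_cal, y_cal, X_new, alpha=0.1):
    """Vanilla split conformal prediction in the output space.

    `model` is any fitted predictor (a callable X -> y_hat). Returns
    (lower, upper) arrays giving intervals with marginal coverage
    >= 1 - alpha under exchangeability of calibration and test data.
    """
    resid = np.abs(y_cal - model(X_cal))      # conformity scores
    n = len(resid)
    # finite-sample quantile: the ceil((n+1)(1-alpha))-th smallest residual
    k_idx = min(n - 1, int(np.ceil((n + 1) * (1 - alpha))) - 1)
    q = np.sort(resid)[k_idx]
    preds = model(X_new)
    return preds - q, preds + q
```

Feature conformal prediction, per the abstract, applies the same recipe to conformity scores computed in a learned feature space rather than on raw outputs.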

    NUMERICAL STUDY ON PROPULSIVE FACTORS IN REGULAR HEAD AND OBLIQUE WAVES

    This paper applies the Reynolds-averaged Navier-Stokes (RANS) method to study propulsion performance in head and oblique waves. The finite volume method (FVM) is employed to discretize the governing equations, and the SST k-ω model is used to model the turbulent flow. The free surface is resolved with the volume of fluid (VOF) method, and a sliding mesh technique enables propeller rotation. Propeller open-water curves are determined from open-water simulations. Calm-water resistance and added wave resistance are obtained from towing computations without the propeller. Self-propulsion simulations in calm water and in waves under varying loads are performed to find the self-propulsion point, and the thrust identity method is used to predict the propulsive factors. Regular head waves with wavelengths from 0.6 to 1.4 times the ship length and oblique waves with incident directions from 0° to 360° are considered. The influence of waves on the propulsive factors, including thrust deduction, wake fraction, and the open-water, relative rotative, hull, and propulsive efficiencies, is discussed.
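The propulsive factors named at the end follow from standard self-propulsion definitions; as a quick reference, a sketch of the usual formulas (the numerical values used below are made up, not results from the paper):

```python
def thrust_deduction(T, R_towed):
    """t = (T - R) / T: the thrust increase needed over towed resistance
    due to hull-propeller interaction."""
    return (T - R_towed) / T

def wake_fraction(V_ship, V_advance):
    """w = (V - Va) / V: mean inflow velocity deficit at the propeller plane."""
    return (V_ship - V_advance) / V_ship

def efficiencies(t, w, eta_open, eta_rel_rot):
    """Hull efficiency eta_H = (1 - t) / (1 - w), and total propulsive
    efficiency eta_D = eta_H * eta_O * eta_R."""
    eta_hull = (1 - t) / (1 - w)
    return eta_hull, eta_hull * eta_open * eta_rel_rot
```

In the thrust identity approach, Va (and hence w) is inferred by matching the self-propulsion thrust coefficient to the open-water curve.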

    Achieving Strong Chemical Interface and Superior Energy-Saving Capability at the Crosslinks of Rubber Composites Containing Graphene Oxide Using Thiol-Vinyl Click Chemistry

    Rapid developments in international transportation inevitably increase the consumption of energy and resources, making minimization of tire rolling resistance a pressing challenge. To lower the rolling resistance of tires, the interaction between fillers and rubber molecules must be enhanced and the dispersion of the fillers improved, so as to reduce the internal friction and viscous loss of rubber composites. In this study, graphene oxide (GO) was modified with γ-mercaptopropyltrimethoxysilane (MPTMS), which carries thiol groups. A modified GO/natural rubber (MGO/NR) masterbatch with finely dispersed MGO was then introduced into solution-polymerized styrene-butadiene rubber (SSBR) to create an MGO/SiO2/SSBR composite. During high-temperature crosslinking, a strong chemical interfacial interaction between the MGO and the rubber molecules was formed by the thiol-vinyl click reaction, and the MGO sheets also act as crosslinks that reinforce the crosslinking network. The results showed that the rolling resistance of the MGO/SiO2/SSBR composite was 19.4% better and its energy loss 15.7% lower than those of the base SiO2/SSBR composite. Strikingly, the wear performance and wet skid resistance improved by 19% and 17.3%, respectively. These results show that a strong interface not only improves rolling-resistance performance but also helps balance the "magic triangle" (the combination of wear resistance, fuel efficiency, and traction) of tire properties.

    Sparse Nonlinear Feature Selection Algorithm via Local Structure Learning

    In this paper, we propose a new unsupervised feature selection algorithm that considers the nonlinear and similarity relationships within the data. To achieve this, we apply the kernel method and local structure learning to capture the nonlinear relationships between features and the local similarity between features. Specifically, we use a kernel function to map each feature of the data into a kernel space. In the high-dimensional kernel space, different features receive different weights, and features with zero weights are unimportant (e.g., redundant) features. Furthermore, we account for the similarity between features through local structure learning, and we propose an effective optimization method to solve the resulting problem. The experimental results show that the proposed algorithm achieves better performance than the comparison algorithms.
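One common way to realize the "zero weights mark unimportant features" idea (illustrative only; the paper's optimization is different) is a sparse self-representation objective, min_W ||X - XW||_F^2 + λ||W||_1, solved by iterative soft thresholding (ISTA). Features that other features depend on for reconstruction receive large weights; redundant or noise features are driven toward zero.

```python
import numpy as np

def sparse_self_representation(X, lam=0.1, iters=500):
    """Sparse feature self-representation via ISTA (illustrative sketch).

    Reconstructs each feature column as a sparse combination of the
    others: min ||X - X W||_F^2 + lam * ||W||_1, with diag(W) = 0 to
    forbid trivial self-reconstruction. Returns W and a per-feature
    importance score (row sums of |W|).
    """
    n, d = X.shape
    W = np.zeros((d, d))
    lr = 1.0 / (np.linalg.norm(X, 2) ** 2)  # 1 / Lipschitz constant of the gradient
    for _ in range(iters):
        grad = X.T @ (X @ W - X)             # gradient of the quadratic term
        W = W - lr * grad
        W = np.sign(W) * np.maximum(np.abs(W) - lr * lam, 0.0)  # soft threshold
        np.fill_diagonal(W, 0.0)
    importance = np.abs(W).sum(axis=1)       # how much feature i helps reconstruct others
    return W, importance
```

A duplicated feature pair ends up strongly linked in W, while independent noise features collect only small weights, so thresholding the importance scores performs the selection.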